1,099 research outputs found

    Discovering cultural differences (and similarities) in facial expressions of emotion

    Understanding the cultural commonalities and specificities of facial expressions of emotion remains a central goal of Psychology. However, recent progress has been stalled by dichotomous debates (e.g., nature versus nurture) that have created silos of empirical and theoretical knowledge. Now, an emerging interdisciplinary scientific culture is broadening the focus of research to provide a more unified and refined account of facial expressions within and across cultures. Specifically, data-driven approaches allow a wider, more objective exploration of face movement patterns that provide detailed information ontologies of their cultural commonalities and specificities. Similarly, a wider exploration of the social messages perceived from face movements diversifies knowledge of their functional roles (e.g., the ‘fear’ face used as a threat display). Together, these new approaches promise to diversify, deepen, and refine knowledge of facial expressions, and deliver the next major milestones for a functional theory of human social communication that is transferable to social robotics.

    Toward a social psychophysics of face communication

    As a highly social species, humans are equipped with a powerful tool for social communication—the face, which can elicit multiple social perceptions in others due to the rich and complex variations of its movements, morphology, and complexion. Consequently, identifying precisely what face information elicits different social perceptions is a complex empirical challenge that has largely remained beyond the reach of traditional research methods. More recently, the emerging field of social psychophysics has developed new methods designed to address this challenge. Here, we introduce and review the foundational methodological developments of social psychophysics, present recent work that has advanced our understanding of the face as a tool for social communication, and discuss the main challenges that lie ahead.

    Cultural differences in the decoding and representation of facial expression signals

    Summary. In this thesis, I will challenge one of the most fundamental assumptions of psychological science – the universality of facial expressions. I will do so by first reviewing the literature to reveal major flaws in the supporting arguments for universality, and then presenting new data demonstrating how culture has shaped the decoding and transmission of facial expression signals. A summary of both sections is presented below.

    Review of the Literature. To obtain a clear understanding of how the universality hypothesis developed, I will trace the historical course of the emotion literature, reviewing relevant works supporting notions of a ‘universal language of emotion.’ Specifically, I will examine work on the recognition of facial expressions across cultures, as it constitutes a main component of the evidence for universality. First, I will reveal that a number of ‘seminal’ works supporting the universality hypothesis are critically flawed, precluding them from further consideration. Second, by questioning the validity of the statistical criteria used to demonstrate ‘universal recognition,’ I will show that long-standing claims of universality are both misleading and unsubstantiated. Relatedly, I will detail the creation of the ‘universal’ facial expression stimulus set (Facial Action Coding System, FACS, coded facial expressions) to reveal that it is in fact a biased, culture-specific representation of Western facial expressions of emotion. The implications for future cross-cultural work are discussed in relation to the limited FACS-coded stimulus set.

    Experimental Work. In reviewing the literature, I will reveal a latent phenomenon which has so far remained unexplained – the East Asian (EA) recognition deficit. Specifically, EA observers consistently perform significantly more poorly than Western Caucasian (WC) observers when categorising certain ‘universal’ facial expressions – a surprisingly neglected finding given the importance of emotion communication for human social interaction. To address this neglected issue, I examined both the decoding and transmission of facial expression signals in WC and EA observers.

    Experiment 1: Cultural Decoding of ‘Universal’ Facial Expressions of Emotion. To examine the decoding of ‘universal’ facial expressions across cultures, I used eye-tracking technology to record the eye movements of WC and EA observers while they categorised the six ‘universal’ facial expressions of emotion. My behavioural results demonstrate the robustness of the phenomenon by replicating the EA recognition deficit: EA observers are significantly poorer at recognising facial expressions of ‘fear’ and ‘disgust,’ and systematically miscategorise ‘fear’ as ‘surprise’ and ‘disgust’ as ‘anger.’ Using spatio-temporal analyses of fixations, I will show that WC and EA observers use culture-specific fixation strategies to decode ‘universal’ facial expressions of emotion. Specifically, while WC observers distribute fixations across the face, sampling the eyes and mouth, EA observers persistently bias fixations towards the eyes and neglect critical features, especially for facial expressions eliciting significant behavioural confusions (i.e., ‘fear,’ ‘disgust,’ and ‘anger’). To objectively examine whether the EA culture-specific fixation pattern could give rise to these confusions, I built a model observer that samples information from the face to categorise facial expressions. Using this model observer, I will show that the EA decoding strategy is inadequate to distinguish ‘fear’ from ‘surprise’ and ‘disgust’ from ‘anger,’ thus giving rise to the reported EA behavioural confusions. For the first time, I will reveal the origins of a latent phenomenon – the EA recognition deficit. I discuss the implications of culture-specific decoding strategies during facial expression categorisation in light of current theories of cross-cultural emotion communication.

    Experiment 2: Cultural Internal Representations of Facial Expressions of Emotion. The preceding experiment presented data that question the universality of facial expressions: WC and EA observers differ significantly in their recognition performance for certain ‘universal’ facial expressions, and culture-specific fixation patterns demonstrate cultural differences in the predicted locations of diagnostic information. Together, these data predict cultural specificity in facial expression signals, supporting notions of cultural ‘accents’ and/or ‘dialects.’ To examine whether facial expression signals differ across cultures, I used a powerful reverse correlation (RC) technique to reveal the internal representations of the six ‘basic’ facial expressions of emotion in WC and EA observers. Using complementary statistical image processing techniques to examine the signal properties of each internal representation, I will directly reveal cultural specificity in the representations of the six ‘basic’ facial expressions of emotion. Specifically, I will show that while WC representations of facial expressions predominantly featured the eyebrows and mouth, EA representations were biased towards the eyes, as predicted by my eye movement data in Experiment 1. I will also show gaze avoidance as a unique feature of the EA group. In sum, these data show clear cultural contrasts in facial expression signals, demonstrating that culture shapes the internal representations of emotion.

    Future Work. My review of the literature will show that pivotal concepts such as ‘recognition’ and ‘universality’ are currently flawed and have misled both the interpretation of empirical work and the direction of theoretical developments. Here, I will examine each concept in turn and propose more accurate criteria with which to demonstrate ‘universal recognition’ in future studies. In doing so, I will also detail possible future studies designed to address current gaps in knowledge created by the use of inappropriate criteria. Relatedly, having questioned the validity of FACS-coded facial expressions as ‘universal’ facial expressions, I will highlight an area for empirical development – the creation of a culturally valid facial expression stimulus set – and detail the future work required to address it. Finally, I will discuss broader areas of interest (e.g., the lexical structure of emotion) which could elevate current knowledge of cross-cultural facial expression recognition and emotion communication in the future.
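The reverse correlation logic used in Experiment 2 can be sketched in a few lines. The following is a minimal toy simulation, not the thesis's actual pipeline: it assumes a simulated observer carrying a known internal template, presents pure-noise stimuli, and recovers the template by averaging the noise fields sorted by the observer's responses (a classification image). The grid size, template location, and decision rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
SIZE = 16  # tiny "face" grid for illustration

# Hypothetical internal template of the simulated observer:
# positive weight on an "eye" band (rows 3-5).
template = np.zeros((SIZE, SIZE))
template[3:6, 2:14] = 1.0

def observer_says_yes(stimulus):
    # The observer endorses a stimulus when it correlates
    # positively with the internal template.
    return float(np.sum(stimulus * template)) > 0

# Reverse correlation: show noise-only stimuli, sort the noise
# fields by response, and subtract the two mean fields.
yes_fields, no_fields = [], []
for _ in range(5000):
    noise = rng.standard_normal((SIZE, SIZE))
    (yes_fields if observer_says_yes(noise) else no_fields).append(noise)

classification_image = (np.mean(yes_fields, axis=0)
                        - np.mean(no_fields, axis=0))

# The recovered image weights the "eye" band most heavily.
eye_mean = classification_image[3:6, 2:14].mean()
rest = classification_image.copy()
rest[3:6, 2:14] = 0
print(eye_mean > rest.mean())  # expect True
```

The same subtraction logic, applied to real participants' categorisation responses, is what lets the technique visualise culture-specific internal representations without pre-specifying which features matter.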

    Four not six: revealing culturally common facial expressions of emotion

    As a highly social species, humans generate complex facial expressions to communicate a diverse range of emotions. Since Darwin’s work, identifying amongst these complex patterns which are common across cultures and which are culture-specific has remained a central question in psychology, anthropology, philosophy, and more recently machine vision and social robotics. Classic approaches to addressing this question typically tested the cross-cultural recognition of theoretically motivated facial expressions representing six emotions, and reported universality. Yet, variable recognition accuracy across cultures suggests a narrower cross-cultural communication, supported by sets of simpler expressive patterns embedded in more complex facial expressions. We explore this hypothesis by modelling the facial expressions of over 60 emotions across two cultures, and segregating out the latent expressive patterns. Using a multi-disciplinary approach, we first map the conceptual organization of a broad spectrum of emotion words by building semantic networks in two cultures. For each emotion word in each culture, we then model and validate its corresponding dynamic facial expression, producing over 60 culturally valid facial expression models. We then apply to the pooled models a multivariate data reduction technique, revealing four latent and culturally common facial expression patterns, each of which communicates specific combinations of valence, arousal, and dominance. We then reveal the face movements that accentuate each latent expressive pattern to create complex facial expressions. Our data question the widely held view that six facial expression patterns are universal, instead suggesting four latent expressive patterns with direct implications for emotion communication, social psychology, cognitive neuroscience, and social robotics.
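The abstract does not name the multivariate data reduction technique, so the sketch below uses PCA (via SVD) on simulated Action Unit activation patterns as a stand-in; the model count, the 42-AU coding, and the rank-4 generative mixture are all illustrative assumptions. It shows how latent expressive patterns shared across many pooled expression models surface as a small number of dominant components.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 120 pooled facial expression models, each
# described by the activation of 42 face movements (Action Units).
N_MODELS, N_AUS, N_LATENT = 120, 42, 4

# Simulate data that truly mixes four latent expressive patterns.
latent_patterns = rng.random((N_LATENT, N_AUS))
weights = rng.random((N_MODELS, N_LATENT))
pooled_models = (weights @ latent_patterns
                 + 0.05 * rng.standard_normal((N_MODELS, N_AUS)))

# PCA via SVD on the mean-centred pooled models.
centred = pooled_models - pooled_models.mean(axis=0)
_, singular_values, components = np.linalg.svd(centred, full_matrices=False)

# Variance explained per component: the first four dominate when
# four latent patterns generated the data.
var_explained = singular_values**2 / np.sum(singular_values**2)
print(var_explained[:6].round(3))
```

On real data the cut-off is a judgment call (scree plots, cross-validation); here the point is only that the number of reliably recoverable components, not a theoretical prior of six, determines how many latent expressive patterns the pooled models contain.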

    Reverse Engineering Psychologically Valid Facial Expressions of Emotion into Social Robots

    Social robots are now part of human society, destined for schools, hospitals, and homes to perform a variety of tasks. To engage their human users, social robots must be equipped with the essential social skill of facial expression communication. Yet, even state-of-the-art social robots are limited in this ability because they often rely on a restricted set of facial expressions derived from theory, with well-known limitations such as lacking naturalistic dynamics. With no agreed methodology to objectively engineer a broader variance of more psychologically impactful facial expressions into the social robots' repertoire, human-robot interactions remain restricted. Here, we address this generic challenge with new methodologies that can reverse-engineer dynamic facial expressions into a social robot head. Our data-driven, user-centered approach, which combines human perception with psychophysical methods, produced highly recognizable and human-like dynamic facial expressions of the six classic emotions that generally outperformed state-of-the-art social robot facial expressions. Our data demonstrate the feasibility of our method applied to social robotics and highlight the benefits of using a data-driven approach that puts human users as central to deriving facial expressions for social robots. We also discuss future work to reverse-engineer a wider range of socially relevant facial expressions, including conversational messages (e.g., interest, confusion) and personality traits (e.g., trustworthiness, attractiveness). Together, our results highlight the key role that psychology must continue to play in the design of social robots.

    Revealing the information contents of memory within the stimulus information representation framework

    The information contents of memory are the cornerstone of the most influential models in cognition. To illustrate, consider that in predictive coding, a prediction implies that specific information is propagated down from memory through the visual hierarchy. Likewise, recognizing the input implies that sequentially accrued sensory evidence is successfully matched with memorized information (categorical knowledge). Although the existing models of prediction, memory, sensory representation and categorical decision are all implicitly cast within an information processing framework, it remains a challenge to precisely specify what this information is, and therefore where, when and how the architecture of the brain dynamically processes it to produce behaviour. Here, we review a framework that addresses these challenges for studies of perception and categorization: stimulus information representation (SIR). We illustrate how SIR can reverse engineer the information contents of memory from behavioural and brain measures in the context of specific cognitive tasks that involve memory. We discuss two specific lessons from this approach that generally apply to memory studies: the importance of task, to constrain what the brain does, and of stimulus variations, to identify the specific information contents that are memorized, predicted, recalled and replayed.

    Data-Driven Methods to Diversify Knowledge of Human Psychology

    Psychology aims to understand real human behavior. However, cultural biases in the scientific process can constrain knowledge. We describe here how data-driven methods can relax these constraints to reveal new insights that theories can overlook. To advance knowledge, we advocate a symbiotic approach that better combines data-driven methods with theory.

    Equipping Social Robots with Culturally-Sensitive Facial Expressions of Emotion Using Data-Driven Methods

    Social robots must be able to generate realistic and recognizable facial expressions to engage their human users. Many social robots are equipped with standardized facial expressions of emotion that are widely considered to be universally recognized across all cultures. However, mounting evidence shows that these facial expressions are not universally recognized - for example, they elicit significantly lower recognition accuracy in East Asian cultures than they do in Western cultures. Therefore, without culturally sensitive facial expressions, state-of-the-art social robots are restricted in their ability to engage a culturally diverse range of human users, which in turn limits their global marketability. To develop culturally sensitive facial expressions, novel data-driven methods use cultural perception to model the dynamic face movement patterns that convey basic emotions (e.g., happy, sad, angry) in a given culture. Here, we tested whether such dynamic facial expression models, derived in an East Asian culture and transferred to a popular social robot, improved the social signalling generation capabilities of the social robot with East Asian participants. Results showed that, compared to the social robot's existing set of ‘universal’ facial expressions, the culturally sensitive facial expression models were recognized with generally higher accuracy and judged as more human-like by East Asian participants. We also detail the specific dynamic face movements (Action Units) that are associated with high recognition accuracy and judgments of human-likeness, including those that further boost performance. Our results therefore demonstrate the utility of using data-driven methods that employ human cultural perception to derive culturally sensitive facial expressions that improve the social face signal generation capabilities of social robots. We anticipate that these methods will continue to inform the design of social robots and broaden their usability and global marketability.

    Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time

    Designed by biological and social evolutionary pressures, facial expressions of emotion comprise specific facial movements to support a near-optimal system of signaling and decoding. Although highly dynamic, little is known about the form and function of facial expression temporal dynamics. Do facial expressions transmit diagnostic signals simultaneously, to optimize categorization of the six classic emotions, or sequentially, to support a more complex communication system of successive categorizations over time? Our data support the latter. Using a combination of perceptual expectation modeling, information theory, and Bayesian classifiers, we show that dynamic facial expressions of emotion transmit an evolving hierarchy of “biologically basic to socially specific” information over time. Early in the signaling dynamics, facial expressions systematically transmit a few biologically rooted face signals supporting the categorization of fewer, elementary categories (e.g., approach/avoidance). Later transmissions comprise more complex signals that support categorization of a larger number of socially specific categories (i.e., the six classic emotions). Here, we show that dynamic facial expressions of emotion provide a sophisticated signaling system, questioning the widely accepted notion that emotion communication comprises six basic (i.e., psychologically irreducible) categories, and instead suggesting four.
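The "evolving hierarchy" claim can be illustrated with a toy information-theoretic calculation. The signal codings below are hypothetical, not the paper's measured data: early signals are assumed to be shared across emotions within broad classes (e.g., eye widening common to 'fear' and 'surprise'), while late signals are assumed to be emotion-specific. Mutual information between signal and emotion category then rises from early to late time points.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(2)

# Hypothetical coding: early on, several emotions share one coarse
# signal; late on, each emotion transmits a distinct diagnostic signal.
EMOTIONS = ["happy", "surprise", "fear", "disgust", "anger", "sad"]
early_signal = {"happy": 0, "sad": 0,        # 0 = low-arousal signal
                "surprise": 1, "fear": 1,    # 1 = eye widening
                "disgust": 2, "anger": 2}    # 2 = nose wrinkling
late_signal = {e: i for i, e in enumerate(EMOTIONS)}  # unique signals

def mutual_information(pairs):
    # I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * np.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

labels = [rng.choice(EMOTIONS) for _ in range(600)]
early_mi = mutual_information([(e, early_signal[e]) for e in labels])
late_mi = mutual_information([(e, late_signal[e]) for e in labels])

# Early signals support only coarse categorization (~log2(3) bits);
# late signals support all six categories (~log2(6) bits).
print(round(early_mi, 2), round(late_mi, 2))
```

In the paper's framing, a Bayesian classifier receiving only the early signals can separate broad classes such as approach versus avoidance, while full six-way categorization becomes possible only once the later, higher-information signals arrive.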